The Global Optimization Geometry of Low-Rank Matrix Optimization

Authors

Abstract

This paper considers general rank-constrained optimization problems that minimize a general objective function $f(X)$ over the set of rectangular $n\times m$ matrices that have rank at most $r$. To tackle the rank constraint and also to reduce the computational burden, we factorize $X$ into $UV^{\mathrm{T}}$, where $U$ and $V$ are $n\times r$ and $m\times r$ matrices, respectively, and then optimize over the small matrices $U$ and $V$. We characterize the global optimization geometry of the nonconvex factored problem and show that the corresponding objective function satisfies the robust strict saddle property as long as the original objective function $f$ satisfies restricted strong convexity and smoothness properties, ensuring global convergence of many local search algorithms (such as noisy gradient descent) in polynomial time for solving the factored problem. We also provide a comprehensive analysis for the matrix factorization problem where we aim to find $n\times r$ and $m\times r$ matrices $U$ and $V$ such that $UV^{\mathrm{T}}$ approximates a given matrix $X^\star$. Aside from the robust strict saddle property, we show that the objective function of the matrix factorization problem has no spurious local minima and obeys the strict saddle property not only for the exact-parameterization case where $\mathrm{rank}(X^\star) = r$, but also for the over-parameterization case where $\mathrm{rank}(X^\star) < r$ and the under-parameterization case where $\mathrm{rank}(X^\star) > r$. These geometric properties imply that a number of iterative optimization algorithms (such as gradient descent) converge to a global solution with random initialization.
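As a rough illustration of the factored approach described above, the following sketch runs plain gradient descent on the matrix-factorization objective $f(U,V) = \frac{1}{2}\|UV^{\mathrm{T}} - X^\star\|_F^2$ from a random initialization. The function name, step size, iteration count, and test matrix are illustrative choices, not taken from the paper.

import numpy as np

def factored_gradient_descent(X_star, r, step=0.01, iters=2000, seed=0):
    """Minimize f(U, V) = 0.5 * ||U V^T - X_star||_F^2 by plain gradient descent.

    The rank constraint is handled by the parameterization X = U V^T with
    U of size n x r and V of size m x r; the small factors are optimized
    directly."""
    rng = np.random.default_rng(seed)
    n, m = X_star.shape
    # Random initialization; the paper's geometric results (no spurious local
    # minima, strict saddle property) are what justify expecting convergence
    # to a global solution from such a start.
    U = 0.1 * rng.standard_normal((n, r))
    V = 0.1 * rng.standard_normal((m, r))
    for _ in range(iters):
        R = U @ V.T - X_star        # residual
        grad_U = R @ V              # gradient of f with respect to U
        grad_V = R.T @ U            # gradient of f with respect to V
        U -= step * grad_U
        V -= step * grad_V
    return U, V

# Usage: exact parameterization, rank(X_star) = r = 2.
X_star = np.add.outer(np.arange(1.0, 6.0), np.arange(4.0))   # 5 x 4 matrix of rank 2
U, V = factored_gradient_descent(X_star, r=2)
print(np.linalg.norm(U @ V.T - X_star))                      # should be small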


Similar Articles

The Global Optimization Geometry of Low-Rank Matrix Optimization

In this paper we characterize the optimization geometry of a matrix factorization problem where we aim to find n×r and m×r matrices U and V such that UV^T approximates a given matrix X. We show that the objective function of the matrix factorization problem has no spurious local minima and obeys the strict saddle property not only for the exact-parameterization case where rank(X) = r, but also f...

Full text

The Non-convex Geometry of Low-rank Matrix Optimization

This work considers the minimization of a general convex function f(X) over the cone of positive semidefinite matrices whose optimal solution X⋆ is of low-rank. Standard first-order convex solvers require performing an eigenvalue decomposition in each iteration, severely limiting their scalability. A natural nonconvex reformulation of the problem factors the variable X into the product of a rec...
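The factored reformulation mentioned in this snippet replaces the positive semidefinite variable X with a factor U such that X = UU^T, so no eigenvalue decomposition is needed per iteration. Below is a minimal sketch of that idea using the quadratic loss $f(X) = \frac{1}{2}\|X - M\|_F^2$ as a stand-in for a general convex f; the names, step size, and test matrix are illustrative assumptions, not from the paper.

import numpy as np

def psd_factored_descent(M, r, step=0.005, iters=3000, seed=0):
    """Minimize f(U U^T) = 0.5 * ||U U^T - M||_F^2 over U of size n x r.

    X = U U^T is positive semidefinite by construction, so the iterations
    need no eigenvalue decomposition and no projection onto the PSD cone."""
    rng = np.random.default_rng(seed)
    n = M.shape[0]
    U = 0.1 * rng.standard_normal((n, r))
    for _ in range(iters):
        R = U @ U.T - M             # gradient of f at X = U U^T (M symmetric)
        U -= step * 2.0 * (R @ U)   # chain rule: grad_U = 2 * R * U
    return U

# Usage: recover a random rank-2 PSD matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 2))
M = A @ A.T                         # PSD, rank 2
U = psd_factored_descent(M, r=2)
print(np.linalg.norm(U @ U.T - M))  # should be small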

Full text

Low-Rank Matrix Completion by Riemannian Optimization

The matrix completion problem consists of finding or approximating a low-rank matrix based on a few samples of this matrix. We propose a novel algorithm for matrix completion that minimizes the least square distance on the sampling set over the Riemannian manifold of fixed-rank matrices. The algorithm is an adaptation of classical non-linear conjugate gradients, developed within the framework o...
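For concreteness, the "least square distance on the sampling set" is the squared error restricted to the observed entries. The short sketch below only evaluates that objective with a 0/1 mask; it does not implement the Riemannian conjugate gradient method the paper develops, and all names and data are illustrative.

import numpy as np

def sampling_set_loss(X, X_obs, mask):
    """Least-squares distance on the sampling set Omega:
    0.5 * || P_Omega(X - X_obs) ||_F^2, where mask is 1 on observed entries
    and 0 elsewhere. The paper minimizes this over the Riemannian manifold
    of fixed-rank matrices; only the objective is sketched here."""
    return 0.5 * np.sum((mask * (X - X_obs)) ** 2)

# Usage: score a rank-2 candidate X = U V^T against partially observed data.
rng = np.random.default_rng(0)
X_obs = rng.standard_normal((5, 4))
mask = (rng.random((5, 4)) < 0.5).astype(float)   # roughly half the entries observed
U, V = rng.standard_normal((5, 2)), rng.standard_normal((4, 2))
print(sampling_set_loss(U @ V.T, X_obs, mask))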

Full text

Global optimization for structured low rank approximation

In this paper, we investigate the complexity of the numerical construction of the Hankel structured low-rank approximation (HSLRA) problem, and develop a family of algorithms to solve this problem. Briefly, HSLRA is the problem of finding the closest (in some pre-defined norm) rank r approximation of a given Hankel matrix, which is also of Hankel structure. Unlike many other methods described i...
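A standard baseline for this kind of structured low-rank approximation (not necessarily one of the algorithms developed in the paper) is Cadzow-style alternating projection: truncate to rank r with an SVD, then restore Hankel structure by averaging anti-diagonals. A minimal sketch, with illustrative names and parameters:

import numpy as np

def hankel_from_signal(x, L):
    """Build an L x (len(x) - L + 1) Hankel matrix from a signal x."""
    return np.array([x[i:i + len(x) - L + 1] for i in range(L)])

def antidiagonal_average(H):
    """Project back onto Hankel matrices by averaging each anti-diagonal."""
    L, K = H.shape
    x = np.zeros(L + K - 1)
    counts = np.zeros(L + K - 1)
    for i in range(L):
        for j in range(K):
            x[i + j] += H[i, j]
            counts[i + j] += 1
    return hankel_from_signal(x / counts, L)

def cadzow(H, r, iters=50):
    """Alternate rank-r truncation (SVD) with re-imposing Hankel structure."""
    for _ in range(iters):
        Uu, s, Vt = np.linalg.svd(H, full_matrices=False)
        H = (Uu[:, :r] * s[:r]) @ Vt[:r, :]   # closest rank-r matrix in Frobenius norm
        H = antidiagonal_average(H)           # restore Hankel structure
    return H

# Usage: noisy sum of two decaying exponentials -> rank-2 Hankel approximation.
t = np.arange(40)
signal = 0.9 ** t + 0.7 ** t + 0.05 * np.random.default_rng(0).standard_normal(40)
H_approx = cadzow(hankel_from_signal(signal, L=20), r=2)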

Full text

Fixed-rank matrix factorizations and Riemannian low-rank optimization

Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian quotient manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and...

Full text


Journal

Journal title: IEEE Transactions on Information Theory

Year: 2021

ISSN: 0018-9448, 1557-9654

DOI: https://doi.org/10.1109/tit.2021.3049171